Initializing Bayesian Hyperparameter Optimization via Meta-Learning
Authors
Abstract
Model selection and hyperparameter optimization is crucial in applying machine learning to a novel dataset. Recently, a subcommunity of machine learning has focused on solving this problem with Sequential Model-based Bayesian Optimization (SMBO), demonstrating substantial successes in many applications. However, for computationally expensive algorithms the overhead of hyperparameter optimization can still be prohibitive. In this paper we mimic a strategy human domain experts use: speed up optimization by starting from promising configurations that performed well on similar datasets. The resulting initialization technique integrates naturally into the generic SMBO framework and can be trivially applied to any SMBO method. To validate our approach, we perform extensive experiments with two established SMBO frameworks (Spearmint and SMAC) with complementary strengths, optimizing two machine learning frameworks on 57 datasets. Our initialization procedure yields mild improvements for low-dimensional hyperparameter optimization and substantially improves the state of the art for the more complex combined algorithm selection and hyperparameter optimization problem.
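The initialization strategy described in the abstract can be sketched as follows: describe each previously seen dataset by a vector of metafeatures, rank past datasets by their distance to the new dataset in metafeature space, and seed SMBO with the best-known configurations from the nearest ones. This is a minimal illustration, not the authors' implementation: the function names, the L1 distance, and the toy metafeatures and configurations are all assumptions made for the sketch.

```python
def warmstart_configs(new_metafeatures, past_datasets, t=2):
    """Pick initial configurations for SMBO via meta-learning.

    past_datasets: list of (metafeature_vector, best_configuration) pairs
    collected from previously optimized datasets. Returns the best-known
    configurations of the t datasets nearest to new_metafeatures.
    """
    def l1(a, b):
        # L1 distance in metafeature space (an assumed choice of metric).
        return sum(abs(x - y) for x, y in zip(a, b))

    ranked = sorted(past_datasets, key=lambda d: l1(new_metafeatures, d[0]))
    return [config for _, config in ranked[:t]]


# Toy usage with made-up metafeature vectors and configurations:
past = [
    ((0.1, 5.0), {"C": 1.0}),
    ((0.9, 2.0), {"C": 10.0}),
    ((0.2, 4.5), {"C": 0.1}),
]
init = warmstart_configs((0.15, 4.8), past, t=2)
# The two nearest datasets here are the first and third entries, so SMBO
# would evaluate {"C": 1.0} and {"C": 0.1} before its model-based search.
```

Because the returned configurations are simply evaluated first, this step slots in front of any SMBO method without changing its inner loop, which is what makes the technique framework-agnostic.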
Similar references
Learning to Warm-Start Bayesian Hyperparameter Optimization
Hyperparameter optimization undergoes extensive evaluations of validation errors in order to find the best configuration of hyperparameters. Bayesian optimization is now popular for hyperparameter optimization, since it reduces the number of validation error evaluations required. Suppose that we are given a collection of datasets on which hyperparameters are already tuned by either humans with ...
Using Meta-Learning to Initialize Bayesian Optimization of Hyperparameters
Model selection and hyperparameter optimization is crucial in applying machine learning to a novel dataset. Recently, a subcommunity of machine learning has focused on solving this problem with Sequential Model-based Bayesian Optimization (SMBO), demonstrating substantial successes in many applications. However, for expensive algorithms the computational overhead of hyperparameter optimization ...
Meta-learning and Algorithm Selection Workshop
Model selection and hyperparameter optimization is crucial in applying machine learning to a novel dataset. Recently, a sub-community of machine learning has focused on solving this problem with Sequential Model-based Bayesian Optimization (SMBO), demonstrating substantial successes in many applications. However, for expensive algorithms the computational overhead of hyperpa...
Combination of Hyperband and Bayesian Optimization for Hyperparameter Optimization in Deep Learning
Deep learning has achieved impressive results on many problems. However, it requires a high degree of expertise or a lot of experience to tune the hyperparameters well, and such a manual tuning process is likely to be biased. Moreover, it is not practical to try out as many different hyperparameter configurations in deep learning as in other machine learning scenarios, because evaluating each singl...
Supplementary material for: Initializing Bayesian Hyperparameter Optimization via Meta-Learning
To evaluate our approach in a realistic setting we implemented 46 metafeatures from the literature listed in Table 1. These metafeatures are computed only for the training set. While most of them can be computed for a whole dataset, some of them (e.g., skewness) are defined for each attribute of a dataset. In this case, we compute the metafeature for each attribute of the dataset and use the m...
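The per-attribute handling described above can be illustrated with skewness: it is defined per column, so one computes it for every attribute and then aggregates the per-column values into a single scalar metafeature. This is a hedged sketch, not the paper's implementation; the function names and the choice of the mean as the aggregate are assumptions, and `skewness` below is a simple population-moment estimate.

```python
import statistics

def skewness(values):
    """Population skewness of a sequence: E[(x - mu)^3] / sigma^3."""
    mu = statistics.mean(values)
    sd = statistics.pstdev(values)
    if sd == 0:
        return 0.0  # a constant column has no skew
    n = len(values)
    return sum((v - mu) ** 3 for v in values) / (n * sd ** 3)

def mean_skewness(dataset):
    """Aggregate a per-attribute metafeature over all columns.

    dataset: list of rows (each row a sequence of numeric attributes).
    Computes skewness per column and returns the mean across columns.
    """
    columns = list(zip(*dataset))
    return statistics.mean(skewness(col) for col in columns)
```

Other aggregates (min, max, standard deviation) fit the same pattern: any function that maps the vector of per-attribute values to a scalar yields a dataset-level metafeature.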